
    Earthquake forecasting and its verification

    No proven method is currently available for reliable short-term prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments of earthquake risk. These are primarily based on the association of small earthquakes with future large earthquakes. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method that quantifies temporal variations in seismicity. The output is a map of areas in a seismogenic region ("hotspots") where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. These forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the relative operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first uses the PI method; the second is a relative intensity (RI) forecast based on the hypothesis that future earthquakes will occur where earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
    Comment: 10(+1) pages, 5 figures, 2 tables. Submitted to Nonlinear Processes in Geophysics on 5 August 200
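    For orientation, the following is a minimal Python sketch of ROC-based verification of a binary alarm forecast: it computes the hit rate and false-alarm rate for an alarm map, then sweeps a threshold over a continuous score to trace points on the ROC curve. The data here are synthetic placeholders, not the PI or RI maps from the paper.

```python
import numpy as np

def roc_point(alarm, occurred):
    """Hit rate H and false-alarm rate F for one binary alarm map."""
    hits = np.sum(alarm & occurred)
    misses = np.sum(~alarm & occurred)
    false_alarms = np.sum(alarm & ~occurred)
    correct_neg = np.sum(~alarm & ~occurred)
    H = hits / (hits + misses)
    F = false_alarms / (false_alarms + correct_neg)
    return H, F

# Sweeping an alarm threshold over a continuous score (e.g. a hotspot map)
# traces out the ROC curve; a forecast with skill lies above the H = F line.
rng = np.random.default_rng(0)
score = rng.random(1000)              # placeholder hotspot scores per cell
occurred = rng.random(1000) < 0.05    # placeholder target earthquakes
for thresh in (0.5, 0.8, 0.95):
    H, F = roc_point(score > thresh, occurred)
    print(f"threshold {thresh}: H={H:.2f}, F={F:.2f}")
```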

    Space-Time Clustering and Correlations of Major Earthquakes

    Earthquake occurrence in nature is thought to result from correlated elastic stresses, leading to clustering in space and time. We show that the occurrence of major earthquakes in California correlates with time intervals when fluctuations in small earthquakes are suppressed relative to the long-term average. We estimate a probability of less than 1% that this coincidence is due to random clustering.
    Comment: 5 pages, 3 figures. Submitted to PR
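    A probability of this kind is typically estimated by randomization. Below is a hedged sketch of such a Monte Carlo test with fabricated placeholder data; the test statistic, binning, and null model are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder data: a fluctuation metric per 0.1-unit time bin, and the
# times of "major" events, over a window [0, 100).
fluct = rng.standard_normal(1000)
major_times = np.sort(rng.uniform(0, 100, size=8))

def mean_fluct_at(times, bin_width=0.1):
    idx = (times / bin_width).astype(int)   # map each time to its bin
    return fluct[idx].mean()

observed = mean_fluct_at(major_times)

# Null hypothesis: major-event times are random within the same window.
n_trials, count = 10_000, 0
for _ in range(n_trials):
    random_times = rng.uniform(0, 100, size=major_times.size)
    if mean_fluct_at(random_times) <= observed:  # fluctuations as suppressed
        count += 1
print(f"estimated p-value: {count / n_trials:.4f}")
```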

    Systematic procedural and sensitivity analysis of the pattern informatics method for forecasting large (M > 5) earthquake events in southern California

    Recent studies in the literature have introduced a new approach to earthquake forecasting based on representing the space-time patterns of localized seismicity by a time-dependent system state vector in a real-valued Hilbert space and deducing information about future space-time fluctuations from the phase angle of the state vector. While the success rate of this Pattern Informatics (PI) method has been encouraging, the method is still in its infancy. Procedural analysis, statistical testing, parameter sensitivity investigation, and optimization all still need to be performed. In this paper, we attempt to optimize the PI approach by developing quantitative measures of "predictive goodness" and analyzing possible variations in the proposed procedure. In addition, we attempt to quantify the systematic dependence on the quality of the input catalog of historic data and develop methods for combining catalogs from regions of different seismic rates.
    Comment: 39 pages, 4 tables, 9 figures. Submitted to Pure and Applied Geophysics on 30 November 200
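    For concreteness, here is a minimal sketch of a PI-style computation as commonly described in this literature: per-box seismicity rates are mean-subtracted and normalized to unit length (the Hilbert-space state vector), and the squared change between two base periods gives a hotspot intensity. The window choices and normalization details below are assumptions, not taken verbatim from this paper.

```python
import numpy as np

def pi_map(rates, t1, t2):
    """rates: (n_boxes, n_times) array of seismicity rates per grid box.

    Returns a squared change in the normalized state vector between
    averaging windows ending at times t1 and t2 (t1 < t2).
    """
    def state(upto):
        m = rates[:, :upto].mean(axis=1)
        m = m - m.mean()                 # zero mean across boxes
        return m / np.linalg.norm(m)     # unit norm: Hilbert-space state
    dphi = state(t2) - state(t1)
    dphi = dphi - dphi.mean()            # remove uniform background change
    return dphi ** 2                     # hotspot intensity per box

rng = np.random.default_rng(2)
rates = rng.poisson(2.0, size=(100, 240)).astype(float)  # toy catalog
hotspots = pi_map(rates, t1=120, t2=240)
print("five strongest hotspot boxes:", np.argsort(hotspots)[-5:])
```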

    Evaluating the RELM Test Results

    We consider implications of the Regional Earthquake Likelihood Models (RELM) test results with regard to earthquake forecasting. Prospective forecasts were solicited for M≥4.95 earthquakes in California during the period 2006–2010. During this period 31 earthquakes occurred in the test region with M≥4.95. We consider five forecasts that were submitted for the test. We compare the forecasts utilizing forecast verification methodology developed in the atmospheric sciences, specifically for tornadoes. We utilize a "skill score" based on the forecast scores λ_fi for the occurrence of the test earthquakes. A perfect forecast would have λ_fi = 1, and a random (no-skill) forecast would have λ_fi = 2.86×10⁻³. The best forecasts (largest values of λ_fi) for the 31 earthquakes ranged from λ_fi = 1.24×10⁻¹ to λ_fi = 5.49×10⁻³. The best mean forecast over all earthquakes was λ̄_f = 2.84×10⁻². The best forecasts are about an order of magnitude better than random forecasts. We discuss the earthquakes, the forecasts, and alternative methods of evaluating the performance of RELM forecasts. We also discuss the relative merits of alarm-based versus probability-based forecasts.
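    A minimal sketch of the comparison described above, using the no-skill baseline quoted in the abstract; apart from the two quoted extremes, the per-earthquake scores below are made-up placeholders (the actual test involved 31 values).

```python
import numpy as np

RANDOM_SCORE = 2.86e-3   # no-skill forecast value quoted in the abstract

# Placeholder per-earthquake forecast scores lambda_fi; the first and last
# are the quoted extremes, the middle two are invented for illustration.
lam = np.array([1.24e-1, 5.2e-2, 1.1e-2, 5.49e-3])
mean_score = lam.mean()
print(f"mean forecast score: {mean_score:.2e}")
print(f"improvement over random: {mean_score / RANDOM_SCORE:.0f}x")
```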

    A damage model based on failure threshold weakening

    A variety of studies have modeled the physics of material deformation and damage as examples of generalized phase transitions, involving either critical phenomena or spinodal nucleation. Here we study a model for frictional sliding with long-range interactions and recurrent damage, parameterized by a process of damage and partial healing during sliding. We introduce a failure threshold weakening parameter into the cellular-automaton slider-block model which allows blocks to fail at a reduced failure threshold for all subsequent failures during an event. We show that a critical point is reached beyond which the probability of a system-wide event scales with this weakening parameter. We provide a mapping to the percolation transition, and show that the values of the scaling exponents approach the values for mean-field percolation (spinodal nucleation) as the lattice size L is increased for fixed R. We also examine the effect of the weakening parameter on the frequency-magnitude scaling relationship and on the ergodic behavior of the model.
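    A hedged sketch of the weakening rule in a mean-field slider-block automaton: after a block first fails in an event, its threshold is reduced by a factor (1 − β) for all subsequent failures during that event. The redistribution rule and all parameters are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def run_event(stress, threshold, beta=0.2):
    """Relax one event; returns the number of block failures (event size)."""
    thr = threshold.copy()
    size = 0
    while True:
        over = stress >= thr
        if not over.any():
            return size
        size += over.sum()
        released = stress[over].sum()
        stress[over] = 0.0
        # Weakening: failed blocks keep a reduced threshold for the rest
        # of this event (idempotent, relative to the original threshold).
        thr[over] = threshold[over] * (1.0 - beta)
        # Long-range (mean-field) stress transfer to every block.
        stress += released / stress.size

rng = np.random.default_rng(3)
N = 256
stress = rng.uniform(0.0, 0.9, N)
threshold = np.ones(N)
stress[rng.integers(N)] = 1.0            # push one block to failure
print("event size:", run_event(stress, threshold))
```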

    Anti-cancer effects and mechanism of actions of aspirin analogues in the treatment of glioma cancer

    INTRODUCTION: In the past 25 years only modest advancements in glioma treatment have been made, with patient prognosis and median survival time following diagnosis increasing only from 3 to 7 months. A substantial body of clinical and preclinical evidence has suggested a role for aspirin in the treatment of cancer, with multiple mechanisms of action proposed, including COX-2 inhibition, downregulation of EGFR expression, and NF-κB signaling affecting Bcl-2 expression. However, with serious side effects such as stroke and gastrointestinal bleeding, aspirin analogues with improved potency and side effect profiles are being developed. METHOD: Effects on cell viability following 24 hr incubation with four aspirin derivatives (PN508, 517, 526 and 529) were compared to cisplatin, aspirin and di-aspirin in four glioma cell lines (U87 MG, SVG P12, GOS-3, and 1321N1) using the PrestoBlue assay, establishing IC50 values and examining the time course of drug effects. RESULTS: All compounds were found to decrease cell viability in a concentration- and time-dependent manner. Significantly, the analogue PN517 (IC50 2 mM) showed approximately a twofold increase in potency when compared to aspirin (3.7 mM) and cisplatin (4.3 mM) in U87 cells, with a similar increase in potency in SVG P12 cells. Other analogues demonstrated similar potency to aspirin and cisplatin. CONCLUSION: These results support the further development and characterization of novel NSAID derivatives for the treatment of glioma.
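    IC50 values like those reported above are conventionally obtained by fitting a four-parameter logistic (Hill) curve to viability data. The sketch below shows that step on fabricated placeholder points, not the study's PrestoBlue measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, bottom, ic50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Fabricated placeholder data: concentration in mM, viability in % of control.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
viability = np.array([98.0, 90.0, 72.0, 38.0, 12.0])

params, _ = curve_fit(hill, conc, viability, p0=[100.0, 0.0, 2.0, 1.0])
print(f"estimated IC50: {params[2]:.2f} mM")
```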